In information theory and statistics, Kullback's inequality is a lower bound on the Kullback–Leibler divergence expressed in terms of the large deviations rate function. If ''P'' and ''Q'' are probability distributions on the real line, such that ''P'' is absolutely continuous with respect to ''Q'', i.e. ''P'' << ''Q'', and whose first moments exist, then

:<math>D_{KL}(P \parallel Q) \ge \Psi_Q^*(\mu'_1(P)),</math>

where <math>\Psi_Q^*</math> is the rate function, i.e. the convex conjugate of the cumulant-generating function, of <math>Q</math>, and <math>\mu'_1(P)</math> is the first moment of <math>P</math>.

The Cramér–Rao bound is a corollary of this result.
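As a concrete illustration (the Gaussian choice here is an assumed example, not part of the theorem), take <math>Q = N(0,1)</math> and <math>P = N(\mu,1)</math>. Then <math>\Psi_Q(\theta) = \theta^2/2</math>, its convex conjugate is <math>\Psi_Q^*(x) = x^2/2</math>, and the bound is attained with equality:

:<math>D_{KL}(P \parallel Q) = \frac{\mu^2}{2} = \Psi_Q^*(\mu'_1(P)).</math>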
==Proof==
Let ''P'' and ''Q'' be probability distributions (measures) on the real line, whose first moments exist, and such that ''P'' << ''Q''. Consider the natural exponential family of ''Q'' given by

:<math>Q_\theta(A) = \frac{\int_A e^{\theta x}\,Q(dx)}{\int_{-\infty}^{\infty} e^{\theta x}\,Q(dx)} = \frac{1}{M_Q(\theta)} \int_A e^{\theta x}\,Q(dx)</math>

for every measurable set ''A'', where <math>M_Q</math> is the moment-generating function of ''Q''. (Note that <math>Q_0 = Q</math>.)
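To see this exponential tilting concretely (again in an assumed Gaussian example), take <math>Q = N(0,1)</math>, so that <math>M_Q(\theta) = e^{\theta^2/2}</math>. The tilted measure <math>Q_\theta</math> then has density proportional to <math>e^{\theta x}e^{-x^2/2} \propto e^{-(x-\theta)^2/2}</math>, i.e. <math>Q_\theta = N(\theta,1)</math>: tilting by θ shifts the mean of ''Q'' to θ.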
Then

:<math>D_{KL}(P \parallel Q) = D_{KL}(P \parallel Q_\theta) + \int_{\operatorname{supp} P} \left(\log \frac{\mathrm{d}Q_\theta}{\mathrm{d}Q}\right) \mathrm{d}P.</math>

By Gibbs' inequality we have <math>D_{KL}(P \parallel Q_\theta) \ge 0</math>, so that

:<math>D_{KL}(P \parallel Q) \ge \int_{\operatorname{supp} P} \left(\log \frac{\mathrm{d}Q_\theta}{\mathrm{d}Q}\right) \mathrm{d}P = \int_{\operatorname{supp} P} \left(\log \frac{e^{\theta x}}{M_Q(\theta)}\right) P(dx).</math>

Simplifying the right side, we have, for every real θ for which <math>M_Q(\theta) < \infty</math>:

:<math>D_{KL}(P \parallel Q) \ge \mu'_1(P)\,\theta - \Psi_Q(\theta),</math>

where <math>\mu'_1(P)</math> is the first moment, or mean, of ''P'', and <math>\Psi_Q = \log M_Q</math> is called the cumulant-generating function. Taking the supremum over θ completes the process of convex conjugation and yields the rate function:

:<math>D_{KL}(P \parallel Q) \ge \sup_\theta \left\{\mu'_1(P)\,\theta - \Psi_Q(\theta)\right\} = \Psi_Q^*(\mu'_1(P)).</math>
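The following minimal sketch checks the inequality numerically; the Gaussian parameters and the grid-search approximation of the convex conjugate are assumptions made for illustration, not part of the article.

<syntaxhighlight lang="python">
import numpy as np

# Illustrative (assumed) choices: P = N(0.7, 0.5^2), Q = N(0, 1).
mu_p, sigma_p = 0.7, 0.5
mu_q, sigma_q = 0.0, 1.0

# Closed-form KL divergence between two univariate Gaussians.
kl = (np.log(sigma_q / sigma_p)
      + (sigma_p**2 + (mu_p - mu_q)**2) / (2 * sigma_q**2)
      - 0.5)

# Cumulant-generating function of Q: Psi_Q(theta) = mu_q*theta + sigma_q^2*theta^2/2.
def psi_q(theta):
    return mu_q * theta + 0.5 * (sigma_q * theta) ** 2

# Rate function Psi_Q*(x) = sup_theta { x*theta - Psi_Q(theta) },
# approximated by a grid search over theta.
thetas = np.linspace(-10.0, 10.0, 200001)
rate = np.max(mu_p * thetas - psi_q(thetas))

print(f"D_KL(P||Q)       = {kl:.6f}")    # ~0.563147
print(f"Psi_Q*(mu'_1(P)) = {rate:.6f}")  # ~0.245000
assert kl >= rate  # Kullback's inequality
</syntaxhighlight>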